Hankel Matrix
In linear algebra, a Hankel matrix (or catalecticant matrix), named after Hermann Hankel, is a square matrix in which each ascending skew-diagonal from left to right is constant, e.g.:

\qquad\begin{bmatrix}
a & b & c & d & e \\
b & c & d & e & f \\
c & d & e & f & g \\
d & e & f & g & h \\
e & f & g & h & i
\end{bmatrix}.

More generally, a Hankel matrix is any n \times n matrix A of the form

A = \begin{bmatrix}
a_0 & a_1 & a_2 & \ldots & \ldots & a_{n-1} \\
a_1 & a_2 & & & & \vdots \\
a_2 & & & & & \vdots \\
\vdots & & & & & a_{2n-4} \\
\vdots & & & & a_{2n-4} & a_{2n-3} \\
a_{n-1} & \ldots & \ldots & a_{2n-4} & a_{2n-3} & a_{2n-2}
\end{bmatrix}.

In terms of the components, if the (i,j) element of A is denoted A_{ij}, and assuming i \le j, then we have A_{i,j} = A_{i+k,j-k} for all k = 0, \ldots, j-i.
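
The constant skew-diagonal structure is easy to construct and verify numerically. The following is a minimal sketch, assuming NumPy and SciPy are available; it uses scipy.linalg.hankel to build the 5 × 5 example above (with a, ..., i replaced by 1, ..., 9) from its first column and last row, and checks the componentwise property A_{i,j} = A_{i+k,j-k}.

```python
# Minimal sketch (assumes NumPy and SciPy): build the 5x5 example above with
# a, ..., i replaced by 1, ..., 9 and verify the constant skew-diagonals.
import numpy as np
from scipy.linalg import hankel

# scipy.linalg.hankel takes the first column and the last row.
A = hankel([1, 2, 3, 4, 5], [5, 6, 7, 8, 9])
print(A)

# Each entry depends only on i + j: A[i, j] == A[i + 1, j - 1].
n = A.shape[0]
assert all(A[i, j] == A[i + 1, j - 1]
           for i in range(n - 1) for j in range(1, n))
```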


Properties

* A Hankel matrix is a symmetric matrix.
* Let J_n be the n \times n exchange matrix. If H is an m \times n Hankel matrix, then H = T J_n where T is an m \times n Toeplitz matrix.
** If T is real symmetric, then H = T J_n will have the same eigenvalues as T up to sign (a numerical check is sketched after this list).
* The Hilbert matrix is an example of a Hankel matrix.
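
A small numerical check of the Toeplitz relation, assuming NumPy and SciPy are available (the particular Toeplitz entries below are arbitrary illustrative values): flipping the columns of a Toeplitz matrix T with the exchange matrix J_n yields a Hankel matrix H = T J_n, and for real symmetric T the eigenvalues of H agree with those of T up to sign, so their absolute values coincide.

```python
# Numerical check (assumes NumPy and SciPy): H = T J_n is Hankel, and for
# real symmetric T its eigenvalues match those of T up to sign.
import numpy as np
from scipy.linalg import toeplitz

n = 5
c = np.array([2.0, 1.0, 0.5, 0.25, 0.125])   # arbitrary illustrative values
T = toeplitz(c)                               # real symmetric Toeplitz matrix
J = np.fliplr(np.eye(n))                      # exchange matrix J_n
H = T @ J                                     # Hankel (and symmetric) matrix

assert np.allclose(H, H.T)

# Compare eigenvalue magnitudes: they coincide, though signs may differ.
eig_T = np.sort(np.abs(np.linalg.eigvalsh(T)))
eig_H = np.sort(np.abs(np.linalg.eigvalsh(H)))
assert np.allclose(eig_T, eig_H)
print(np.linalg.eigvalsh(T), np.linalg.eigvalsh(H))
```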


Hankel operator

A Hankel operator on a Hilbert space is one whose matrix is a (possibly infinite) Hankel matrix with respect to an orthonormal basis. As indicated above, a Hankel matrix is a matrix with constant values along its antidiagonals, which means that a Hankel matrix A must satisfy, for all rows i and columns j, A_{i,j} = a_{i+j} for some sequence (a_k); note that every entry A_{i,j} depends only on i+j. Let the corresponding Hankel operator be H_\alpha. Given a Hankel matrix A, the corresponding Hankel operator is defined as H_\alpha(u) = Au. We are often interested in Hankel operators H_\alpha: \ell^2\left(\mathbb{Z}^+ \cup \{0\}\right) \rightarrow \ell^2\left(\mathbb{Z}^+ \cup \{0\}\right) over the Hilbert space \ell^2(\mathbb{Z}), the space of square-summable bilateral complex sequences. For any u \in \ell^2(\mathbb{Z}), we have \|u\|_{\ell^2(\mathbb{Z})}^2 = \sum_{n=-\infty}^{\infty} |u_n|^2. We are often interested in approximations of the Hankel operators, possibly by low-order operators. In order to approximate the output of the operator, we can use the spectral norm (operator 2-norm) to measure the error of our approximation. This suggests the singular value decomposition as a possible technique to approximate the action of the operator. Note that the matrix A does not have to be finite; if it is infinite, traditional methods of computing individual singular vectors will not work directly. We also require that the approximation itself be a Hankel matrix; that such an optimal Hankel-structured approximation exists is the content of AAK theory.
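
For a finite Hankel matrix this approximation idea can be illustrated directly. The sketch below, assuming NumPy, forms a small Hankel matrix from the sequence a_k = 1/(k+1) and computes its best rank-k approximation in the spectral norm via a truncated SVD; note that this Eckart–Young optimum is generally not itself Hankel, and obtaining an optimal approximation that preserves the Hankel structure is what AAK theory addresses.

```python
# Illustrative sketch (assumes NumPy): truncated SVD gives the best rank-k
# approximation of a finite Hankel matrix in the spectral norm, with error
# equal to the (k+1)-th singular value. The truncation is generally not Hankel.
import numpy as np

a = 1.0 / np.arange(1, 12)                       # generating sequence a_k = 1/(k+1)
A = np.array([[a[i + j] for j in range(6)] for i in range(6)])  # 6x6 Hankel matrix

U, s, Vt = np.linalg.svd(A)
k = 2                                            # target rank
A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

err = np.linalg.norm(A - A_k, 2)                 # spectral-norm error
print(err, s[k])                                 # the two values agree
```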
The determinant of a Hankel matrix is called a catalecticant.


Hankel matrix transform

The Hankel matrix transform, or simply Hankel transform, produces the sequence of the determinants of the Hankel matrices formed from the given sequence. Namely, the sequence \{h_n\}_{n \ge 0} is the Hankel transform of the sequence \{b_n\}_{n \ge 0} when

h_n = \det (b_{i+j})_{0 \le i,j \le n}.

The Hankel transform is invariant under the binomial transform of a sequence. That is, if one writes

c_n = \sum_{k=0}^{n} \binom{n}{k} b_k

as the binomial transform of the sequence \{b_n\}, then one has

\det (b_{i+j})_{0 \le i,j \le n} = \det (c_{i+j})_{0 \le i,j \le n}.
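
As a concrete illustration, the sketch below (assuming NumPy; the Catalan numbers are used because their Hankel transform is the all-ones sequence) computes the determinants h_n and checks the binomial-transform invariance numerically.

```python
# Sketch (assumes NumPy): compute the Hankel transform h_n = det(b_{i+j}),
# 0 <= i, j <= n, and check invariance under the binomial transform.
import numpy as np
from math import comb

def hankel_transform(b):
    # The n-th determinant needs b_0, ..., b_{2n}, so n runs up to (len(b)-1)//2.
    n_max = (len(b) - 1) // 2
    return [np.linalg.det(np.array([[b[i + j] for j in range(n + 1)]
                                    for i in range(n + 1)]))
            for n in range(n_max + 1)]

# Catalan numbers: their Hankel transform is 1, 1, 1, ...
b = [1, 1, 2, 5, 14, 42, 132, 429, 1430]
print(hankel_transform(b))                       # ~[1.0, 1.0, 1.0, 1.0, 1.0]

# Binomial transform c_n = sum_k C(n, k) b_k leaves the Hankel transform unchanged.
c = [sum(comb(n, k) * b[k] for k in range(n + 1)) for n in range(len(b))]
print(hankel_transform(c))                       # same values, up to rounding error
```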


Applications of Hankel matrices

Hankel matrices are formed when, given a sequence of output data, a realization of an underlying state-space or hidden Markov model is desired. The singular value decomposition of the Hankel matrix provides a means of computing the ''A'', ''B'', and ''C'' matrices which define the state-space realization. The Hankel matrix formed from the signal has been found useful for decomposition of non-stationary signals and for time-frequency representation.
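
A hedged sketch of this idea, assuming NumPy (the system matrices A_true, B_true, C_true below are invented purely to generate illustrative impulse-response data, and the procedure is a generic Ho-Kalman-style realization rather than any specific cited algorithm): the SVD of the Hankel matrix of Markov parameters h_k = C A^k B factors it into observability and controllability parts, from which A, B, and C are recovered up to a change of state coordinates.

```python
# Hedged sketch (assumes NumPy) of a Ho-Kalman-style state-space realization
# from impulse-response (Markov) parameters h_k = C A^k B. The "true" system
# below is hypothetical and only used to generate the data.
import numpy as np

A_true = np.array([[0.9, 0.2], [0.0, 0.5]])
B_true = np.array([[1.0], [1.0]])
C_true = np.array([[1.0, 0.0]])
h = [(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item() for k in range(12)]

m = 5                                                  # Hankel block size
H  = np.array([[h[i + j]     for j in range(m)] for i in range(m)])
Hs = np.array([[h[i + j + 1] for j in range(m)] for i in range(m)])   # shifted

U, s, Vt = np.linalg.svd(H)
r = int(np.sum(s > 1e-8 * s[0]))                       # numerical rank = model order

O = U[:, :r] @ np.diag(np.sqrt(s[:r]))                 # observability factor
R = np.diag(np.sqrt(s[:r])) @ Vt[:r, :]                # controllability factor
A_est = np.linalg.pinv(O) @ Hs @ np.linalg.pinv(R)     # from the shifted Hankel matrix
B_est = R[:, :1]
C_est = O[:1, :]

# The identified (A, B, C) reproduce the Markov parameters.
h_est = [(C_est @ np.linalg.matrix_power(A_est, k) @ B_est).item() for k in range(12)]
assert np.allclose(h, h_est)
print(np.round(h_est, 6))
```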


Method of moments for polynomial distributions

The method of moments applied to polynomial distributions results in a Hankel matrix that needs to be inverted in order to obtain the weight parameters of the polynomial distribution approximation.J. Munkhammar, L. Mattsson, J. Rydén (2017) "Polynomial probability distribution estimation using the method of moments". PLoS ONE 12(4): e0174573. https://doi.org/10.1371/journal.pone.0174573
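
As a rough illustration of why a Hankel matrix appears, consider approximating a density f on [0, 1] by a polynomial \sum_i w_i x^i and matching its moments; the resulting linear system has coefficient matrix with entries \int_0^1 x^{k+i}\,dx = 1/(k+i+1), which depends only on k+i and is therefore a Hankel matrix (in fact the Hilbert matrix). The sketch below, assuming NumPy, follows this general idea and is not necessarily the exact procedure of the cited paper.

```python
# Hedged sketch (assumes NumPy) of the general idea, not necessarily the exact
# procedure of the cited paper: match sample moments of data on [0, 1] with the
# moments of a polynomial density sum_i w_i x^i. The coefficient matrix
# M[k][i] = 1/(k + i + 1) is a Hankel (Hilbert) matrix that must be inverted.
import numpy as np

rng = np.random.default_rng(0)
samples = rng.beta(2.0, 5.0, size=100_000)        # example data supported on [0, 1]

d = 4                                             # polynomial degree
moments = np.array([np.mean(samples**k) for k in range(d + 1)])

M = np.array([[1.0 / (k + i + 1) for i in range(d + 1)] for k in range(d + 1)])
w = np.linalg.solve(M, moments)                   # weight parameters of the polynomial

x = np.linspace(0.0, 1.0, 5)
print(w)
print(np.polynomial.polynomial.polyval(x, w))     # estimated density values at x
```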


Positive Hankel matrices and the Hamburger moment problem

A real sequence \{m_n\}_{n \ge 0} is the moment sequence of some positive measure on the real line if and only if every finite Hankel matrix (m_{i+j})_{0 \le i,j \le n} is positive semi-definite; this characterization is the content of the Hamburger moment problem.


See also

* Toeplitz matrix, an "upside down" (i.e., row-reversed) Hankel matrix
* Cauchy matrix
* Vandermonde matrix



